compiler: Enhance IR to support more advanced parlang (CUDA/HIP/SYCL) features #2717
FabioLuporini wants to merge 18 commits into main
Conversation
Force-pushed 901aca9 to 7ca18ee
Force-pushed 7ca18ee to af2339d
Codecov Report

❌ Patch coverage is

@@ Coverage Diff @@
##             main    #2717      +/-   ##
==========================================
- Coverage   92.10%   92.08%   -0.03%
==========================================
  Files         248      248
  Lines       49654    49739      +85
  Branches     4368     4373       +5
==========================================
+ Hits        45734    45801      +67
- Misses       3213     3228      +15
- Partials      707      710       +3
mloubout
left a comment
Minor comments but looks good
    return super().__mul__(other)
...
class Terminal:
...
    self.tensor = tensor

    def _hashable_content(self):
        return super()._hashable_content() + (self.tensor,)
self.tensor._hashable_content() might be more efficient
But couldn't that potentially cause key clashes if you somehow had both a FunctionMap and its tensor in the same mapper/set? I would say the current one is safer imo
Yes, this is the way to go; otherwise you might hash exactly the same as a pure tensor.
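To illustrate the clash the thread is worried about, here is a minimal, hypothetical sketch (these are not Devito's actual classes): if a wrapper's hashable content reduced to exactly the wrapped tensor's, the wrapper and the bare tensor could collide in the same mapper/set; mixing in the wrapper's own identity keeps them distinct.

```python
# Hypothetical stand-ins, not Devito's real IR classes.

class Tensor:
    def __init__(self, name):
        self.name = name

    def _hashable_content(self):
        return (self.name,)

    def __hash__(self):
        return hash(self._hashable_content())

    def __eq__(self, other):
        return (type(other) is type(self) and
                self._hashable_content() == other._hashable_content())


class FunctionMap:
    """Wraps a tensor; its hashable content includes its own class tag."""

    def __init__(self, tensor):
        self.tensor = tensor

    def _hashable_content(self):
        # Mixing in the class name avoids hashing exactly like a bare tensor
        return (type(self).__name__,) + self.tensor._hashable_content()

    def __hash__(self):
        return hash(self._hashable_content())

    def __eq__(self, other):
        return (type(other) is type(self) and
                self._hashable_content() == other._hashable_content())


t = Tensor('u')
m = FunctionMap(t)
s = {t, m}       # both can live in the same set without colliding
assert len(s) == 2
```

Had `FunctionMap._hashable_content` simply returned `self.tensor._hashable_content()`, both objects would have compared hash-equal and a mapper keyed on hashable content could conflate them.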
    is_Array = True

    _symbol_prefix = 'a'
...
        self._directions = frozendict(directions)
        directions = directions or {}
        directions = {d: v for d, v in directions.items() if d in self.intervals}
        directions.update({i.dim: Any for i in self.intervals
Would it be worth renaming the direction Any to avoid potential squatting on typing.Any?
We should rather ask the Python developers to revisit their type-hinting craziness 😂
        self._directions = frozendict(directions)

    def __repr__(self):
        ret = ', '.join(["%s%s" % (repr(i), repr(self.directions[i.dim]))
...
        frees = obj._C_free
...
        if obj.free_symbols - {obj}:
kwargs = {'objs' if obj.free_symbols - {obj} else 'standalones': definition,
          'efuncs': efuncs, 'frees': frees}
storage.update(obj, site, **kwargs)

perhaps?
Closing in favor of #2748 since I've just finished a massive rebase.
Extended version of #2708